Smoothness of conditional independence models for discrete data

Author

  • Antonio Forcina
Abstract

We investigate the family of conditional independence models that require constraints on complete but non-hierarchical marginal log-linear parameters. By exploiting results on the mixed parameterization, we show that these models are smooth when the Jacobian of a reconstruction algorithm has spectral radius strictly less than 1. This condition is always satisfied in simple contexts where only two marginals are involved. For the general case, we describe an efficient algorithm for checking, with high probability, whether the condition is satisfied; this approach is applied to assess the smoothness of several conditional independence models.
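
The abstract does not spell out the checking procedure, but the stated criterion (the Jacobian of a reconstruction step having spectral radius strictly below 1, verified "with high probability") lends itself to a Monte Carlo check of the kind sketched below. This is a minimal illustration under stated assumptions, not the paper's implementation: the names `spectral_radius`, `check_contraction`, and `reconstruction_jacobian` are hypothetical, and the Dirichlet sampling of evaluation points is an assumption.

```python
import numpy as np

def spectral_radius(J):
    """Largest modulus among the eigenvalues of a square matrix J."""
    return np.max(np.abs(np.linalg.eigvals(J)))

def check_contraction(reconstruction_jacobian, dim, n_draws=1000, seed=0):
    """Monte Carlo check of the smoothness condition: sample strictly positive
    probability vectors and verify that the Jacobian of the reconstruction step
    has spectral radius < 1 at every sampled point.

    reconstruction_jacobian : callable taking a probability vector of length
        `dim` and returning the (dim x dim) Jacobian of the reconstruction step
        at that point (model-specific; a placeholder here).
    """
    rng = np.random.default_rng(seed)
    worst = 0.0
    for _ in range(n_draws):
        p = rng.dirichlet(np.ones(dim))   # random interior point of the simplex
        worst = max(worst, spectral_radius(reconstruction_jacobian(p)))
    return worst  # condition holds at all sampled points iff worst < 1

# Toy usage: a map whose Jacobian is 0.5 * I everywhere is a contraction,
# so the reported worst-case spectral radius is 0.5 < 1.
print(check_contraction(lambda p: 0.5 * np.eye(p.size), dim=8, n_draws=100))
```

Because the spectral radius is evaluated only at randomly sampled points, a value below 1 supports smoothness with high probability rather than proving it, while a single sampled point with radius at or above 1 is enough to flag a potential failure of the condition.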


Similar articles

A Brief Introduction to Graphical Models and How to Learn Them from Data

  • Graphical Models: Core Ideas and Notions
  • A Simple Example: How does it work in principle?
  • Conditional Independence Graphs: conditional independence and the graphoid axioms; separation in (directed and undirected) graphs; decomposition/factorization of distributions
  • Evidence Propagation in Graphical Models
  • Building Graphical Models
  • Learning Graphical Models from Data: quantitative (parameter) and qu...

Conditional Dependence in Longitudinal Data Analysis

Mixed models are widely used to analyze longitudinal data. In their conventional formulation as linear mixed models (LMMs) and generalized LMMs (GLMMs), a commonly indispensable assumption in settings involving longitudinal non-Gaussian data is that the longitudinal observations from subjects are conditionally independent, given subject-specific random effects. Although conventional Gaussian...

Financial Risk Modeling with Markov Chain

Investors use different approaches to select an optimal portfolio, so optimal investment choices according to return can be interpreted with different models. The traditional approach to portfolio selection is called mean-variance analysis. Another approach is the Markov chain. A Markov chain is a random process without memory, meaning that the conditional probability distribution of the nex...

Iterative Conditional Fitting for Discrete Chain Graph Models

‘Iterative conditional fitting’ is a recently proposed algorithm that can be used for maximization of the likelihood function in marginal independence models for categorical data. This paper describes a modification of this algorithm, which allows one to compute maximum likelihood estimates in a class of chain graph models for categorical data. The considered discrete chain graph models are def...

Bayesian Test of Significance for Conditional Independence: The Multinomial Model

Conditional independence tests have received special attention lately in machine learning and computational intelligence related literature as an important indicator of the relationship among the variables used by their models. In the field of probabilistic graphical models, which includes Bayesian network models, conditional independence tests are especially important for the task of learning ...


Journal:
  • J. Multivariate Analysis

Volume 106, Issue –

Pages –

Publication date: 2012